Dew Point vs. Relative Humidity

Simple formulas are fun and powerful things. My rule-of-thumb for handicapping runners based on age, sex, and weight is helpful in comparing people with one another. Likewise my empirical relationship between speed and distance, and my equation for how much poor pacing hurts during a race. I'm still looking for a good way to estimate the winning chances for each team in the middle of a baseball game.

My latest challenge: find a simple relationship between Dew Point and Relative Humidity. There are complex equations, but much better would be something one could do in one's head while jogging along. Dew point is the more important parameter: it changes less throughout the day, and a high dew point is strongly correlated with uncomfortable conditions for exercising. Relative humidity, however, is what's most often reported on TV and radio.

At 100% relative humidity the dew point equals the current temperature. As the relative humidity falls, so does the dew point. Over a wide range of conditions, the difference between the temperature and the dew point (in °F) is the relative humidity's shortfall from 100%, divided by a factor somewhere between 2 and 5. As an equation I propose:

Temperature - Dew Point = (100% - Relative Humidity) / 2.5

It's far from perfect, but it's simple and it's a start. For temperatures between 40°F and 100°F and relative humidity above 50% it gives the right answer to within 3°F. But the errors get rather bad at lower or higher temperatures and at low humidity levels. Can this formula be improved without making it too much more complicated?
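As a rough numerical check, here is a minimal Python sketch that compares the rule of thumb against the Magnus-Tetens dew-point approximation; the constants 17.27 and 237.7 °C below are one common parameterization of that formula, and the function names are just illustrative:

    import math

    # Magnus-Tetens constants (temperatures in degrees C); one common parameterization
    A, B = 17.27, 237.7

    def dew_point_magnus_f(temp_f, rh_percent):
        """Dew point (degrees F) from the Magnus-Tetens approximation."""
        t_c = (temp_f - 32.0) / 1.8
        gamma = A * t_c / (B + t_c) + math.log(rh_percent / 100.0)
        return (B * gamma / (A - gamma)) * 1.8 + 32.0

    def dew_point_rule_f(temp_f, rh_percent):
        """Dew point (degrees F) from the rule of thumb: T - Td = (100 - RH) / 2.5."""
        return temp_f - (100.0 - rh_percent) / 2.5

    # Tabulate the rule's error over the range discussed above
    for temp_f in range(40, 101, 10):
        for rh in range(50, 101, 10):
            err = dew_point_rule_f(temp_f, rh) - dew_point_magnus_f(temp_f, rh)
            print(f"T={temp_f:3d}F  RH={rh:3d}%  error={err:+5.1f}F")

On that grid the rule mostly stays within about 3°F of the Magnus estimate, with the largest misses toward the cool end of the temperature range.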

^z - 2009-08-17
